The video game industry (often referred to as interactive entertainment) is the economic sector involved with the development, marketing and sale of video games. It encompasses dozens of job disciplines and employs thousands of people worldwide.
Considered by some a curiosity in the mid-1970s, the computer and video game industries have grown from niche markets into the mainstream. They took in about US$9.5 billion in the United States in 2007 and US$11.7 billion in 2008 (ESA annual report).
Modern personal computers owe many advancements and innovations to the game industry: sound cards, graphics cards and 3D graphics accelerators, CD-ROM and DVD-ROM drives, and faster CPUs are a few of the more notable improvements. Unix in particular was developed in part so that its programmers could play the space-flight game Space Travel.[1][2]
Sound cards were developed to add digital-quality sound to games and were only later improved for music and audiophiles.[3] Graphics cards were initially developed to display more colors; later they were developed for graphical user interfaces (GUIs) and games. GUIs drove the need for high resolutions, while games drove 3D acceleration. Graphics cards are also among the only pieces of hardware to allow multiple hookups (such as SLI or CrossFire configurations). CD-ROMs and DVD-ROMs were developed for mass distribution of media in general, but the ability to store more information on cheap, easily distributed media was instrumental in driving their ever-higher speeds.
Modern games are among the most demanding applications of PC resources. Many high-powered personal computers are purchased by gamers who want the fastest equipment to power the latest cutting-edge games. Thus, the pace of CPU development is due in part to this industry, whose games demand faster processors than most business or personal applications.
Ben Sawyer of Digitalmill observes that the game industry value chain is made up of six connected and distinctive layers.
The game industry employs people experienced in other traditional businesses, but many roles require experience tailored to games; many recruiters, for example, target only game industry professionals. Disciplines specific to the game industry include game programmer, game designer, level designer, game producer, game artist and game tester. Most of these professionals are employed by video game developers or video game publishers, though many hobbyists also produce computer games and sell them commercially.
The computer game industry formed out of a hobbyist culture in the late 1970s, when personal computers were just beginning to become widely available. The industry grew along with the advancement of computing technology, and often drove that advancement.
In the mid-1980s the industry crashed due to the production of too many poorly developed games (quantity over quality), resulting in the fall of the North American industry and the rise of the Japanese industry, particularly the games company Nintendo.[4]
The 1990s saw significant advancements in game-related technology.
Today, the video game industry is a juggernaut of development; profit still drives technological advancement, which is then used by other industry sectors. Though maturing, the industry remains volatile, with third-party video game developers quickly cropping up and, just as quickly, going out of business.
Early on, development costs were minimal, and video games could be quite profitable. Games developed by a single programmer, or by a small team of programmers and artists, could sell hundreds of thousands of copies each. Many of these games took only a few months to create, so developers could release several titles each year. Thus, publishers could often be generous with benefits such as royalties on games sold. Many early game publishers, such as Origin Systems, Sierra Entertainment, Capcom, Activision and Electronic Arts, emerged in this economic climate.
As computing and graphics power increased, so too did the size of development teams, as larger staffs were needed to address the ever increasing graphical and programming complexities. Now budgets can easily reach millions of dollars, even if middleware and pre-built game engines are used. Most professional games require one to three years to develop, further increasing the strain on budgets.
Some developers are turning to alternative production and distribution methods, such as online distribution, to reduce costs.
Today, the video game industry has a major impact on the economy through the sales of major systems and games such as Grand Theft Auto IV, which took in over US$500 million during its opening week.[5] That figure exceeded both the opening weekend of the film Spider-Man 3 and the previous record holder among video games, Halo 3.[6] Many individuals have also benefited from the economic success of video games, including the former chairman of Nintendo and Japan's third-richest man, Hiroshi Yamauchi.[7]
The video game industry currently faces financial strain as it attempts to compensate its talent fairly while continuing to turn a profit. The result is that the independent game developer, traditionally the source of new games, is essentially dying out or being absorbed into large publishers. The industry is experiencing a phase of consolidation and vertical integration in reaction to spiraling costs. This climate has also given birth to a vibrant scene of tiny indie game developers trying to use the internet, rather than traditional retail channels, to reach an audience.
Video game industry practices are similar to those of other entertainment industries (e.g. the music recording industry), but the video game industry in particular has been accused of treating its development talent poorly. This promotes independent development, as developers leave to form new companies and projects. In some notable cases, these new companies grow large and impersonal, having adopted the business practices of their forebears, and ultimately perpetuate the cycle.
However, unlike the music industry, where modern technology has allowed a fully professional product to be created extremely inexpensively by an independent musician, modern games require increasing amounts of manpower and equipment. This dynamic makes publishers, who fund the developers, much more important than in the music industry.
A particularly famous case is the "original" independent developer Activision, founded by former Atari developers. Activision grew to become the world's second-largest game publisher.[8] In the meantime, many of the original developers left to work on other projects. For example, founder Alan Miller left Activision to start another video game development company, Accolade (later acquired by Infogrames, now Atari).
Activision was popular among developers for crediting them on the packaging and title screens of their games, a practice Atari disallowed. As the video game industry took off in the mid-1980s, many developers faced the more distressing problem of working with fly-by-night or unscrupulous publishers that would either fold unexpectedly or run off with the game profits.
In 2004, the U.S. game industry as a whole was worth US$10.3 billion.[9] However, economic problems remain with regard to publisher-developer contracts (see copyright: transfer of rights). Typically, developers receive a royalty of around 20% of the sales profits, with the rest going to the publisher. Rather than dividing royalties, many publishers buy the development studio outright. Some developers begrudge the tendency for a studio's original management to leave in the wake of a buyout, while the remaining employees try to finish the project only to be shut down after a few years. These buyouts often result in a big push to finish video game projects in time for the holiday purchasing season, and in a transfer of creative control to the publisher.
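The royalty split described above can be sketched in a few lines. This is a hypothetical illustration only: the 20% figure is the approximate industry number cited here, and real contract terms vary widely.

```python
def split_profits(sales_profit, royalty_rate=0.20):
    """Divide sales profits between developer and publisher.

    The default 20% developer royalty is the rough figure cited
    above; actual publisher-developer contracts vary widely.
    """
    developer_share = sales_profit * royalty_rate
    publisher_share = sales_profit - developer_share
    return developer_share, publisher_share

# Under a flat 20% royalty on US$10 million of sales profits,
# the developer keeps US$2 million and the publisher US$8 million.
dev, pub = split_profits(10_000_000)
```

The imbalance is what motivates the outright buyouts mentioned above: owning the studio lets the publisher keep the full margin rather than sharing it.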
Some people disapprove of publishers having creative control, since they are more apt to follow short-term market trends than to invest in risky but potentially lucrative ideas. On the other hand, publishers may know better than developers what consumers want. The relationship between video game developers and publishers parallels that between recording artists and record labels in many ways. But unlike the music industry, which saw flat or declining sales in the early 2000s,[10][11][12] the video game industry continues to grow.[13] Also, personal computers have made the independent development of music almost effortless, while the gap between the product of an independent game developer and that of a fully financed one grows ever larger.
In the computer games industry, it is easier to create a startup, and many successful companies have resulted. The console games industry is more closed: a game developer must obtain up to three separate licenses from the console manufacturer.
In addition, the developer must usually buy development systems from the console manufacturer in order to even develop a game for consideration, as well as obtain concept approval for the game from the console manufacturer. Therefore, the developer normally has to have a publishing deal in place before starting development on a game project, but in order to secure a publishing deal, the developer must have a track record of console development, something which few startups will have.
An alternative method for publishing video games is to self-publish using the shareware or open source model over the Internet. However, it remains to be seen whether freely made and distributed games can survive in the era of multi-million dollar production.
The Japanese video game industry is markedly different from the industry in North America and Europe.
Generally, games have a greater market share of total entertainment in Japan than in the West. Japan has created some of the largest and most expensive titles ever made, such as the Shenmue, Final Fantasy, Metal Gear and Mario series of games.
Video game arcades remain relatively popular in Japan; for every arcade game released in the US, nine are released in Japan. The history of the Japanese arcade is significant both in the decline of the American arcade and in the shape of game design in general. In particular, the prominence of the arcade scene has caused Japanese developers to lag in sound effects and sound design, because audio matters less in an arcade setting. For example, a modern game like Tekken 4 still uses 16 kHz samples, as in the original arcade release.
Consoles and arcade games are the main media for Japanese game design; PC games are nowhere near as popular. This necessarily dictates that there are fewer independently developed games coming from Japan, as it is much harder to develop independently for a console than it is for a PC.
The structure and culture of a Japanese game developer is different from a western one. Throughout the history of Japanese game design, many developers have seen fit to remain mostly anonymous, even using pseudonyms to a large degree in video game credits.
The division of labor in video game development also differs. For example, Japanese design teams had a dedicated designer (whom they called a "director") much earlier than American teams adopted the practice. Japanese teams have also historically been far larger than comparable western ones. Street Fighter II (1991), a Japanese title, had nearly one artist assigned to each character in the game, plus two programmers and a musician, for a team of twenty or more people. Mortal Kombat (1992), a comparable American title, was developed by four people: a programmer, an artist, a musician, and a background artist.
The UK industry is the third-largest in the world in terms of developer success and sales of hardware and software by country alone. The country houses 23 of the top 100 studios in the world.[14] In recent years many of these studios have become defunct or been sold to rival companies, among them LittleBigPlanet developer Media Molecule[15] and Codemasters.[16] Though the country houses many of the world's most successful franchises, such as Grand Theft Auto, Fable, DiRT and Total War, it is still trying to find its identity: many of its games show little cultural influence from the UK, with most influence derived from North America.
The UK also lacks tax relief for game development, which means much of its talented development workforce will move overseas for greater profit, along with the parent companies of certain developers, which would otherwise pay to have games developed in the UK.[17] Though the industry is currently large, at its present rate the country will slide down the rankings, falling behind countries such as France and, most significantly, Canada.
A fairly recent practice of the video game industry, dating from the mid-1990s, is the rise of game players as developers of game content. This rise of players as fourth-party developers allows for more open-source models of game design, development and engineering. Players create user modifications (mods), which in some cases become as popular as, or even more popular than, the original game. An example is Counter-Strike, which began as a mod of Half-Life and eventually became a very successful published game in its own right.
While this "community of modifiers" may add up to only approximately 1% of a particular game's user base, the number of those involved will grow as more games offer modding opportunities (such as by releasing source code) and as the international community of gamers rises. According to Ben Sawyer, as many as 600,000 established online game community developers will exist by 2012. This will effectively add a new component to the game industry value chain and, if it continues to mature, will integrate itself into the overall industry.[4]